
    Homotopy Theoretic Models of Type Theory

    We introduce the notion of a logical model category: a Quillen model category satisfying some additional conditions. These conditions provide enough expressive power that one can soundly interpret dependent products and sums in it. On the other hand, they are easy to check and provide a wide class of models, some of which are listed in the paper. Comment: Corrected version of the published article.
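
    As background (the paper's own formulation may differ), dependent sums and products are standardly interpreted via the adjoint triple around the pullback functor, which is what the stated conditions are meant to make available in the homotopical setting; schematically, in LaTeX:

        \[
          \Sigma_f \;\dashv\; f^{*} \;\dashv\; \Pi_f,
          \qquad
          f^{*} : \mathcal{C}/B \to \mathcal{C}/A
          \quad \text{for a fibration } f : A \to B,
        \]
        % dependent sums are interpreted by the left adjoint $\Sigma_f$
        % (composition with $f$), dependent products by the right adjoint
        % $\Pi_f$; the extra conditions must ensure these adjoints exist
        % and respect the model structure.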

    Particle Acceleration in Cosmic Sites - Astrophysics Issues in our Understanding of Cosmic Rays

    Laboratory experiments to explore plasma conditions and stimulated particle acceleration can illuminate aspects of the cosmic particle acceleration process. Here we discuss the variety of candidate cosmic-ray source objects and what has been learned about their particle-acceleration characteristics, and we identify open issues as discussed among astrophysicists. -- The cosmic-ray differential intensity spectrum is a rather smooth power-law spectrum, with two kinks at the "knee" (~10^15 eV) and at the "ankle" (~3×10^18 eV). It is unclear whether these kinks reflect boundaries between different dominating sources or rather characteristics of cosmic-ray propagation. We believe that Galactic sources dominate up to 10^17 eV or even above, and that the extragalactic contribution at the highest energies merges rather smoothly with the Galactic one throughout the 10^15--10^18 eV range. Pulsars and supernova remnants are among the prime candidates for Galactic cosmic-ray production, while the nuclei of active galaxies are considered the best candidates for producing ultrahigh-energy cosmic rays of extragalactic origin. Acceleration processes are related to shocks from violent ejections of matter by energetic sources such as supernova explosions or matter accretion onto black holes. The details of such acceleration are difficult to model, as relativistic particles modify the structure of the shock, and simple approximations or perturbation calculations are unsatisfactory. This is where laboratory plasma experiments are expected to contribute, by illuminating the non-linear processes which occur under such conditions. Comment: Accepted for publication in EPJD, topical issue on Fundamental physics and ultra-high laser fields. From a review talk at the "Extreme Light Infrastructure" workshop, Sep 2008. Version 2, May 2009: adjusted some wordings and references at the EPJD proofs stage.
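
    For orientation, the "rather smooth power-law spectrum" with its two kinks is conventionally summarized as follows (textbook spectral indices, not quoted from this abstract):

        \[
          \frac{dN}{dE} \propto E^{-\gamma},
          \qquad
          \gamma \approx 2.7 \quad (E \lesssim 10^{15}\,\mathrm{eV}),
          \qquad
          \gamma \approx 3.1 \quad (10^{15}\,\mathrm{eV} \lesssim E \lesssim 3\times 10^{18}\,\mathrm{eV}),
        \]
        % with the spectrum hardening again (smaller $\gamma$) above the ankle.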

    Relic neutrino masses and the highest energy cosmic rays

    We consider the possibility that a large fraction of the ultrahigh energy cosmic rays are decay products of Z bosons which were produced in the scattering of ultrahigh energy cosmic neutrinos on cosmological relic neutrinos. We compare the observed ultrahigh energy cosmic ray spectrum with the one predicted in the above Z-burst scenario and determine the required mass of the heaviest relic neutrino, as well as the necessary ultrahigh energy cosmic neutrino flux, via a maximum likelihood analysis. We show that the value of the neutrino mass obtained in this way is fairly robust against variations in presently unknown quantities, such as the amount of neutrino clustering, the universal radio background, and the extragalactic magnetic field, within their anticipated uncertainties. Much stronger systematic uncertainty arises from different possible assumptions about the diffuse background of ordinary cosmic rays from unresolved astrophysical sources. In the most plausible case that these ordinary cosmic rays are protons of extragalactic origin, one is led to a required neutrino mass in the range 0.08 eV - 1.3 eV at the 68% confidence level. This range narrows down considerably if a particular universal radio background is assumed, e.g. to 0.08 eV - 0.40 eV for a large one. The required flux of ultrahigh energy cosmic neutrinos near the resonant energy should be detected in the near future by AMANDA, RICE, and the Pierre Auger Observatory; otherwise the Z-burst scenario will be ruled out. Comment: 19 pages, 22 figures, REVTeX.
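
    The "resonant energy" mentioned at the end follows from standard s-channel kinematics for Z production on a relic neutrino approximately at rest (background derivation, not quoted from the paper):

        \[
          E_{\nu}^{\mathrm{res}} = \frac{M_Z^{2}}{2 m_{\nu}}
          \approx 4.2 \times 10^{21}\,\mathrm{eV}\,
          \left(\frac{\mathrm{eV}}{m_{\nu}}\right),
        \]
        % so the fitted mass range 0.08--1.3 eV corresponds to resonant
        % neutrino energies of roughly $3 \times 10^{21}$ to
        % $5 \times 10^{22}$ eV.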

    Energy and Flux Measurements of Ultra-High Energy Cosmic Rays Observed During the First ANITA Flight

    The first flight of the Antarctic Impulsive Transient Antenna (ANITA) experiment recorded 16 radio signals that were emitted by cosmic-ray induced air showers. For 14 of these events, this radiation was reflected from the ice. The dominant contribution to the radiation comes from the deflection of positrons and electrons in the geomagnetic field, which is beamed in the direction of motion of the air shower. This radiation is reflected from the ice and subsequently detected by the ANITA experiment at a flight altitude of 36 km. In this paper, we estimate the energy of the 14 individual events and find that the mean energy of the cosmic-ray sample is 2.9 EeV. By simulating the ANITA flight, we calculate its exposure for ultra-high energy cosmic rays. We estimate, for the first time, the cosmic-ray flux derived only from radio observations. In addition, we find that the Monte Carlo simulation of the ANITA data set is in agreement with the total number of observed events and with the properties of those events. Comment: Added more explanation of the experimental setup and textual improvements.
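
    A minimal sketch of how a differential flux estimate follows from an event count and a simulated exposure; the exposure value and log-energy bin width below are hypothetical placeholders, not ANITA's published numbers:

        # Counting estimate of a differential flux: J(E) ~ N / (exposure * E * dlnE).
        n_events = 14                 # reflected cosmic-ray events (from the abstract)
        mean_energy_eV = 2.9e18       # mean sample energy, 2.9 EeV (from the abstract)

        exposure_km2_sr_yr = 1.0      # HYPOTHETICAL time-integrated exposure near 2.9 EeV
        delta_lnE = 1.0               # HYPOTHETICAL width of the log-energy bin

        exposure_m2_s_sr = exposure_km2_sr_yr * 1e6 * 3.15576e7   # km^2 sr yr -> m^2 sr s
        flux = n_events / (exposure_m2_s_sr * mean_energy_eV * delta_lnE)
        print(f"J(E) ~ {flux:.2e} per (m^2 sr s eV) at E ~ {mean_energy_eV:.1e} eV")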

    The Antarctic Impulsive Transient Antenna Ultra-high Energy Neutrino Detector Design, Performance, and Sensitivity for 2006-2007 Balloon Flight

    We present a detailed report on the Antarctic Impulsive Transient Antenna (ANITA) long-duration balloon payload, including its design philosophy and realization, physics simulations, the performance of the instrument during its first Antarctic flight, completed in January 2007, and expectations for the limiting neutrino detection sensitivity. Neutrino physics results will be reported separately. Comment: 50 pages, 49 figures, in preparation for PR

    Graph Neural Networks for low-energy event classification & reconstruction in IceCube

    IceCube, a cubic-kilometer array of optical sensors built to detect atmospheric and astrophysical neutrinos between 1 GeV and 1 PeV, is deployed 1.45 km to 2.45 km below the surface of the ice sheet at the South Pole. The classification and reconstruction of events from the in-ice detectors play a central role in the analysis of IceCube data. Reconstructing and classifying events is a challenge due to the irregular detector geometry, inhomogeneous scattering and absorption of light in the ice and, below 100 GeV, the relatively low number of signal photons produced per event. To address this challenge, it is possible to represent IceCube events as point cloud graphs and use a Graph Neural Network (GNN) as the classification and reconstruction method. The GNN is capable of distinguishing neutrino events from cosmic-ray backgrounds, classifying different neutrino event types, and reconstructing the deposited energy, direction and interaction vertex. Based on simulation, we provide a comparison in the 1 GeV–100 GeV energy range to the state-of-the-art maximum likelihood techniques used in current IceCube analyses, including the effects of known systematic uncertainties. For neutrino event classification, the GNN increases the signal efficiency by 18% at a fixed background rate, compared to current IceCube methods. Alternatively, the GNN offers a reduction of the background (i.e. false positive) rate by over a factor of 8 (to below half a percent) at a fixed signal efficiency. For the reconstruction of energy, direction, and interaction vertex, the resolution improves by an average of 13%–20% compared to current maximum likelihood techniques in the energy range of 1 GeV–30 GeV. The GNN, when run on a GPU, is capable of processing IceCube events at nearly double the median IceCube trigger rate of 2.7 kHz, which opens the possibility of using low-energy neutrinos in online searches for transient events. Peer Reviewed.
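
    A minimal sketch of the point-cloud-graph idea in plain PyTorch; this is not the IceCube/GraphNeT implementation, and the hit features (x, y, z, time, charge), layer sizes, and neighbor count k are illustrative assumptions:

        import torch
        import torch.nn as nn

        def knn_edges(pos: torch.Tensor, k: int = 8) -> torch.Tensor:
            """Connect each hit to its k nearest hits; returns (2, N*k) edge indices."""
            d = torch.cdist(pos, pos)                  # pairwise distances (N, N)
            d.fill_diagonal_(float("inf"))             # exclude self-loops
            nbrs = d.topk(k, largest=False).indices    # (N, k) nearest neighbors
            src = torch.arange(pos.size(0)).repeat_interleave(k)
            return torch.stack([src, nbrs.reshape(-1)])

        class EdgeConvBlock(nn.Module):
            """EdgeConv-style message passing with mean aggregation:
            h_i' = mean_j MLP([h_i, h_j - h_i])."""
            def __init__(self, in_dim: int, out_dim: int):
                super().__init__()
                self.mlp = nn.Sequential(nn.Linear(2 * in_dim, out_dim), nn.ReLU(),
                                         nn.Linear(out_dim, out_dim), nn.ReLU())

            def forward(self, h: torch.Tensor, edges: torch.Tensor) -> torch.Tensor:
                src, dst = edges
                msgs = self.mlp(torch.cat([h[src], h[dst] - h[src]], dim=-1))
                counts = torch.bincount(src, minlength=h.size(0)).clamp(min=1)
                out = torch.zeros(h.size(0), msgs.size(-1)).index_add(0, src, msgs)
                return out / counts.unsqueeze(-1)      # average incoming messages

        class HitGNN(nn.Module):
            """Event-level score from a variable-size set of sensor hits."""
            def __init__(self, n_feat: int = 5, hidden: int = 64):
                super().__init__()
                self.conv1 = EdgeConvBlock(n_feat, hidden)
                self.conv2 = EdgeConvBlock(hidden, hidden)
                self.head = nn.Linear(hidden, 1)

            def forward(self, feats: torch.Tensor, pos: torch.Tensor) -> torch.Tensor:
                edges = knn_edges(pos)
                h = self.conv2(self.conv1(feats, edges), edges)
                return self.head(h.mean(dim=0))        # mean-pool hits -> event score

        feats = torch.randn(40, 5)                     # toy event: 40 hits, 5 features
        score = HitGNN()(feats, feats[:, :3])          # first 3 features as positions
        print(torch.sigmoid(score))                    # e.g. signal-vs-background probability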

    A muon-track reconstruction exploiting stochastic losses for large-scale Cherenkov detectors

    IceCube is a cubic-kilometer Cherenkov telescope operating at the South Pole. The main goal of IceCube is the detection of astrophysical neutrinos and the identification of their sources. High-energy muon neutrinos are observed via the secondary muons produced in charged-current interactions with nuclei in the ice. Currently, the best-performing muon-track directional reconstruction is based on a maximum likelihood method using the arrival time distribution of Cherenkov photons registered by the experiment's photomultipliers. A known systematic shortcoming of the prevailing method is its assumption of a continuous energy loss along the muon track. However, at energies >1 TeV, the light yield from muons is dominated by stochastic showers. This paper discusses a generalized ansatz in which the expected arrival time distribution is parametrized by a stochastic muon energy loss pattern. This more realistic parametrization of the loss profile improves the muon angular resolution over existing algorithms by up to 20% for through-going tracks and by up to a factor of 2 for starting tracks. Additionally, the procedure to estimate the directional reconstruction uncertainty has been improved to be more robust against numerical errors.
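
    Schematically (our notation, not the paper's), the generalized ansatz replaces the single continuous-loss light-yield template with a per-segment decomposition that enters the arrival-time likelihood:

        \[
          -\ln \mathcal{L}
          = -\sum_{i \in \text{hits}} \ln p\!\left(t_i \,\middle|\,
            \vec{x}_0, \hat{d}, t_0, \{E_j\}\right),
          \qquad
          \lambda_i = \sum_{j \in \text{segments}} E_j \, \Lambda_{ij},
        \]
        % $\lambda_i$: expected light yield at sensor $i$; $\Lambda_{ij}$:
        % per-unit-energy yield from track segment $j$; the segment energies
        % $E_j$ (the stochastic loss pattern) are fit together with the
        % track parameters $(\vec{x}_0, \hat{d}, t_0)$.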

    Toward criteria for pragmatic measurement in implementation research and practice: a stakeholder-driven approach using concept mapping

    BACKGROUND: Advancing implementation research and practice requires valid and reliable measures of implementation determinants, mechanisms, processes, strategies, and outcomes. However, researchers and implementation stakeholders are unlikely to use measures if they are not also pragmatic. The purpose of this study was to establish a stakeholder-driven conceptualization of the domains that comprise the pragmatic measure construct. It built upon a systematic review of the literature and semi-structured stakeholder interviews that generated 47 criteria for pragmatic measures, and aimed to further refine that set of criteria by identifying conceptually distinct categories of the pragmatic measure construct and providing quantitative ratings of the criteria's clarity and importance. METHODS: Twenty-four stakeholders with expertise in implementation practice completed a concept mapping activity wherein they organized the initial list of 47 criteria into conceptually distinct categories and rated their clarity and importance. Multidimensional scaling, hierarchical cluster analysis, and descriptive statistics were used to analyze the data. FINDINGS: The 47 criteria were meaningfully grouped into four distinct categories: (1) acceptable, (2) compatible, (3) easy, and (4) useful. Average ratings of clarity and importance at the category and individual-criterion level will be presented. CONCLUSIONS: This study advances the field of implementation science and practice by providing clear and conceptually distinct domains of the pragmatic measure construct. Next steps will include a Delphi process to develop consensus on the most important criteria and the development of quantifiable pragmatic rating criteria that can be used to assess measures.
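
    A sketch of the concept-mapping analysis pipeline named in METHODS (multidimensional scaling followed by hierarchical clustering); the sorting data below are random stand-ins for the real 24-stakeholder x 47-criteria pile sorts, and scikit-learn is an assumed tool choice:

        import numpy as np
        from sklearn.manifold import MDS
        from sklearn.cluster import AgglomerativeClustering

        rng = np.random.default_rng(0)
        n_sorters, n_items = 24, 47
        # Each row: one stakeholder's pile assignments for all 47 criteria (toy data).
        piles = rng.integers(0, 6, size=(n_sorters, n_items))

        # Dissimilarity: fraction of stakeholders who sorted items i and j into
        # different piles.
        diss = np.mean(piles[:, :, None] != piles[:, None, :], axis=0)
        np.fill_diagonal(diss, 0.0)

        # Place the criteria on a 2-D point map, then cut the map into candidate
        # categories (four, matching the reported solution).
        coords = MDS(n_components=2, dissimilarity="precomputed",
                     random_state=0).fit_transform(diss)
        labels = AgglomerativeClustering(n_clusters=4).fit_predict(coords)
        print(np.bincount(labels))    # sizes of the four candidate categories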